Patent abstract:
The present invention relates to a mobile terminal (100) and a control method thereof, by which saved image data can be searched efficiently. The present invention includes a memory (170) configured to store at least one image data, a touch screen (151), and a controller (180) outputting a thumbnail list for the stored image data at a first scale through the touch screen (151). When the controller (180) receives a scroll input several times for the output thumbnail list, if the scroll input received several times satisfies a scaling condition, the controller (180) changes the first scale of the output thumbnail list to a second scale.
Publication number: FR3022648A1
Application number: FR1554364
Filing date: 2015-05-13
Publication date: 2015-12-25
Inventors: Jinuh Joo; Sanghun Joo; Eunshin Kim
Applicant: LG Electronics Inc.
IPC main classification:
Patent description:

[0001] The present invention relates to a mobile terminal, and more particularly, to a mobile terminal and a control method thereof. Although the present invention is suitable for a wide range of applications, it is particularly suitable for facilitating the use of a terminal by further considering the convenience of the user. A mobile terminal is a device that can be configured to perform various functions. Examples of such functions include data and voice communications, image and video capture via a camera, audio recording, music file playback via a speaker system, and displaying images and video on a display device. Some terminals include an additional feature that supports games, while other terminals are also configured as multimedia players. More recently, mobile terminals have been configured to receive broadcast signals that permit viewing of content such as videos and television programs. Generally, terminals can be classified as mobile terminals and stationary terminals according to a presence or non-presence of mobility. And, the mobile terminals can be further classified as handheld terminals and vehicle-mounted terminals according to availability for hand-carry. There are ongoing efforts to support and increase the functionality of mobile terminals. Such efforts include software and hardware improvements, as well as changes and improvements in the structural components that form the mobile terminal. A mobile terminal provides a control method for saving a large quantity of image data and for reading the saved image data. A general control method for reading saved image data is described as follows. First, a thumbnail list of the saved image data is output. Second, a user searches for desired image data through an image preview of elementary thumbnails.
Finally, if a prescribed item (i.e., desired image data to be found) is selected from the thumbnail list, the image data corresponding to the selected item can be displayed in detail. However, if the quantity of saved image data increases, the method described above may not be appropriate. As a result, the demand for research and development of a control method for a more efficient image data search continues to grow. Accordingly, embodiments of the present invention are directed to a mobile terminal and a control method thereof that substantially obviate one or more problems due to limitations and disadvantages of the related art. The present invention is directed to a mobile terminal and a control method thereof, by which image data can be searched easily and conveniently. Technical tasks obtainable from the present invention are not limited to the above-mentioned technical tasks. And, other technical tasks not mentioned will be clearly understood from the following description by those having ordinary skill in the art to which the present invention pertains. Additional advantages, objects, and features of the invention will be set forth in this disclosure as well as in the accompanying drawings. Such aspects may also be appreciated by those skilled in the art based on this disclosure. To achieve these objects and other advantages and in accordance with the purpose of the invention, as embodied and broadly described herein, a mobile terminal according to one embodiment of the present invention may include a memory configured to store at least one image data, a touch screen, and a controller outputting a thumbnail list for the stored image data at a first scale through the touch screen, the controller receiving a scroll input several times for the output thumbnail list, and, if the scroll input received several times satisfies a scaling condition, the controller changing the first scale of the output thumbnail list to a second scale.
Preferably, if a prescribed touch gesture is applied to the output thumbnail list, the controller may display a detailed display screen of a prescribed image data group instead of outputting the thumbnail list.
[0002] More preferably, the prescribed touch gesture may include a touch input performed by applying a touch to a prescribed elementary thumbnail and then dragging it in a right-to-left direction while maintaining the touch.
[0003] And, the prescribed image data group may include a set of at least one image data having the same attribute as the touched prescribed elementary thumbnail. Preferably, the controller can assign the stored image data to at least one group. If the scroll input received several times satisfies a grouping change condition, the controller may output a group list for the at least one group instead of outputting the thumbnail list. More preferably, if a touch-swipe input is applied to a prescribed elementary group of the group list, the controller may display a preview of at least one image data assigned to the prescribed elementary group on a path of the touch-swipe input. More preferably, the grouping change condition may include a condition of further satisfying the scaling condition while the thumbnail list is output at a prescribed scale. Preferably, the scaling condition may include a condition that a scroll input reception count is equal to or greater than a prescribed count and/or a condition that a scroll input speed is equal to or greater than a prescribed speed. Preferably, the mobile terminal may further include a camera. If the scale of the thumbnail list is changed to the second scale, the controller can automatically activate the camera. Preferably, the mobile terminal may further include a microphone. The controller can recognize a voice received through the microphone. The controller may search keyword information of the stored image data based on the recognized voice. And, the controller can output the found image data through the touch screen.
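The scale-change and grouping-change logic summarized above can be sketched as follows. This is a minimal illustration only, not the patent's implementation; the threshold values, the third scale level, and the names `PRESCRIBED_COUNT`, `PRESCRIBED_SPEED` and `next_display_state` are assumptions introduced for the example.

```python
# Minimal sketch of the scaling / grouping-change conditions described above.
# Threshold values and names are illustrative assumptions, not taken from the patent.

PRESCRIBED_COUNT = 3      # scroll input reception count needed to trigger a scale change
PRESCRIBED_SPEED = 500.0  # scroll input speed threshold (e.g., pixels per second)
SCALES = ["first", "second", "third"]  # progressively coarser thumbnail scales (assumed)
PRESCRIBED_SCALE_INDEX = len(SCALES) - 1  # the "prescribed scale" before grouping

def satisfies_scaling_condition(scroll_count, scroll_speed):
    """Scaling condition: reception count >= prescribed count and/or speed >= prescribed speed."""
    return scroll_count >= PRESCRIBED_COUNT or scroll_speed >= PRESCRIBED_SPEED

def next_display_state(scale_index, scroll_count, scroll_speed):
    """Return ('thumbnails', new_scale_index) or ('group_list', None)."""
    if not satisfies_scaling_condition(scroll_count, scroll_speed):
        return ("thumbnails", scale_index)
    if scale_index >= PRESCRIBED_SCALE_INDEX:
        # Grouping change condition: the scaling condition is satisfied again
        # while the thumbnail list is already output at the prescribed scale,
        # so the display switches from the thumbnail list to the group list.
        return ("group_list", None)
    return ("thumbnails", scale_index + 1)
```

Repeated or fast scroll inputs first coarsen the thumbnail scale; once the prescribed (coarsest) scale is reached, a further qualifying scroll input switches the display to the group list.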
In another aspect of the present invention, as embodied and broadly described herein, a method of controlling a mobile terminal according to another embodiment of the present invention may include the steps of: storing at least one image data; outputting a thumbnail list for the stored image data at a first scale through a touch screen; receiving a scroll input several times for the output thumbnail list; and, if the scroll input received several times satisfies a scaling condition, changing the first scale of the output thumbnail list to a second scale. Effects obtainable from the present invention may not be limited by the above-mentioned effects. And, other effects not mentioned will be clearly understood from the following description by those having ordinary skill in the art to which the present invention pertains. It is to be understood that both the foregoing general description and the following detailed description of the preferred embodiments of the present invention are exemplary and explanatory, and are intended to provide further explanation of the invention as claimed. Other features and advantages of the invention will emerge more clearly on reading the following description, made with reference to the appended drawings, which are given by way of illustration only and in no way limit the present invention, and in which: Fig. 1A is a block diagram of a mobile terminal according to the present invention; Figs. 1B and 1C are conceptual views of one example of the mobile terminal, viewed from different directions; Fig. 2 is a flowchart for a method of automatically adjusting a scale of a thumbnail list according to one embodiment of the present invention; Figs. 3A, 3B, 3C and 3D are state diagrams for a control method of automatically adjusting a scale of a thumbnail list according to one embodiment of the present invention; Figs.
4A, 4B and 4C are diagrams for one example of automatically changing a display type of a thumbnail list into a group-type list according to one embodiment of the present invention; Figs. 5A and 5B are diagrams for a control method of automatically recognizing a face and then assisting a search for image data of the recognized face according to one embodiment of the present invention; Figs. 6A and 6B are diagrams for a control method of assisting an image data search using voice recognition according to one embodiment of the present invention; Figs. 7A and 7B are diagrams for a control method of outputting detailed image data during the output of a thumbnail list according to one embodiment of the present invention; Figs. 8A, 8B and 8C are diagrams for one example of a group-type list according to one embodiment of the present invention; Figs. 9A, 9B and 9C are diagrams for a control method of easily reading image data included in a group thumbnail according to one embodiment of the present invention; Figs. 10A and 10B are diagrams for a control method for a case of selecting a displayed image data according to one embodiment of the present invention; Figs. 11A, 11B and 11C are diagrams for a control method of reading image data on a map according to one embodiment of the present invention; Figs. 12A, 12B and 12C are diagrams for a control method of changing a scale of a display type in response to a scroll input on a news item reading screen according to one embodiment of the present invention; Figs. 13A, 13B and 13C are diagrams for a control method of changing a scale or a display type in response to a scroll input on a mail list screen according to one embodiment of the present invention; and Figs. 14A and 14B are diagrams for a control method of inputting a search word for an image data search through a touch input according to one embodiment of the present invention.
[0004] The present invention will now be described in detail according to exemplary embodiments with reference to the accompanying drawings. For the sake of brief description with reference to the drawings, the same or equivalent components may be provided with the same reference numbers, and their description will not be repeated. In general, a generic term such as "module" or "unit" may be used to refer to elements or components. Such a generic term is used here merely with the intention of facilitating the description of the specification, and the generic term itself is not intended to carry any special meaning or function. In this disclosure, that which is well known to those of ordinary skill in the art has generally been omitted for the sake of brevity. The accompanying drawings are used to facilitate understanding of the various technical features, and it should be understood that the embodiments presented herein are not limited by the accompanying drawings. As such, this disclosure should be construed as extending to all equivalents, modifications and substitutes in addition to those particularly set out in the accompanying drawings. It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are generally used only to distinguish one element from another. It will be understood that when an element is referred to as being "connected to" another element, the element can be connected to the other element or intervening elements may also be present. In contrast, when an element is referred to as being "directly connected to" another element, there are no intervening elements present. A singular representation may include a plural representation unless it represents a definitely different meaning from the context.
Terms such as "include" or "have" are used herein, and it should be understood that they are intended to indicate the existence of the several components, functions or steps disclosed in this specification, and it is also to be understood that greater or fewer components, functions or steps may likewise be utilized.
[0005] Mobile terminals presented here may be implemented using a variety of different types of terminals. Examples of such terminals include cellular phones, smart phones, user equipment, laptop computers, digital broadcast terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), navigators, portable computers (PCs), slate PCs, tablet PCs, ultrabooks, wearable devices (e.g., smart watches, smart glasses, head-mounted displays (HMDs)), and the like.
[0006] By way of non-limiting example only, the description will be made with reference to particular types of mobile terminals. However, such teachings apply equally to other types of terminals, such as the types noted above. In addition, these teachings may also be applied to stationary terminals such as digital TVs, desktop computers, and similar devices. Reference will now be made to Figs. 1A to 1C, where Fig. 1A is a block diagram of a mobile terminal according to the present invention, while Figs. 1B and 1C are conceptual views of one example of the mobile terminal, viewed from different directions.
[0007] The mobile terminal 100 is shown having components such as a wireless communication unit 110, an input unit 120, a detection unit 140, an output unit 150, an interface unit 160, a memory 170, a controller 180 and a power supply unit 190. It is to be understood that implementing all of the illustrated components is not a requirement, and that greater or fewer components may alternatively be implemented. Referring now to Fig. 1A, the mobile terminal 100 is shown having a wireless communication unit 110 configured with several commonly implemented components. For example, the wireless communication unit 110 typically includes one or more components that permit wireless communication between the mobile terminal 100 and a wireless communication system or network within which the mobile terminal is located. The wireless communication unit 110 typically includes one or more modules that permit communications such as wireless communications between the mobile terminal 100 and a wireless communication system, communications between the mobile terminal 100 and another mobile terminal, and communications between the mobile terminal 100 and an external server. In addition, the wireless communication unit 110 typically includes one or more modules that connect the mobile terminal 100 to one or more networks. To facilitate such communications, the wireless communication unit 110 includes one or more of the following modules: a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114 and a location information module 115.
[0008] The input unit 120 includes a camera 121 for obtaining images or video, a microphone 122, which is a type of audio input device for inputting an audio signal, and a user input unit 123 (for example, a touch key, a push key, a mechanical key, a soft key and the like) for allowing a user to input information. Data (e.g., audio, video, image, etc.) is obtained by the input unit 120 and may be analyzed and processed by the controller 180 according to device parameters, user commands, and combinations thereof. The detection unit 140 is typically implemented using one or more sensors configured to detect internal information of the mobile terminal, the surrounding environment of the mobile terminal, user information, and other similar information. For example, in Fig. 1A, the detection unit 140 is shown having a proximity sensor 141 and an illumination sensor 142. If desired, the detection unit 140 may alternatively or additionally include other types of sensors or devices, such as a touch sensor, an acceleration sensor, a magnetic sensor, a gravity sensor, a gyroscope sensor, a motion sensor, an RGB sensor, an infrared (IR) sensor, a fingerprint sensor, an ultrasonic sensor, an optical sensor (e.g., camera 121), a microphone 122, a battery gauge, an environment sensor (e.g., a barometer, a hygrometer, a thermometer, a radiation detection sensor, a thermal sensor, and a gas sensor, among others), and a chemical sensor (e.g., an electronic nose, a health care sensor, a biometric sensor, and similar sensors), to name a few. The mobile terminal 100 may be configured to utilize information obtained from the detection unit 140, and in particular, information obtained from one or more sensors of the detection unit 140, and combinations thereof.
[0009] The output unit 150 is typically configured to output various types of information, such as audio, video, and tactile output. The output unit 150 is shown having a display unit 151, an audio output module 152, a haptic module 153 and an optical output module 154. The display unit 151 may have an inter-layered structure or an integrated structure with a touch sensor in order to facilitate a touch screen. The touch screen may provide an output interface between the mobile terminal 100 and a user, as well as function as the user input unit 123 which provides an input interface between the mobile terminal 100 and the user. The interface unit 160 serves as an interface with various types of external devices that can be coupled to the mobile terminal 100. The interface unit 160, for example, may include any of wired or wireless ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, and similar ports. In some cases, the mobile terminal 100 may perform matching control functions associated with a connected external device, in response to the external device being connected to the interface unit 160. The memory 170 is typically implemented to store data to support various functions or features of the mobile terminal 100. For example, the memory 170 may be configured to store application programs executed in the mobile terminal 100, data or instructions for operations of the mobile terminal 100, and the like. Some of these application programs may be downloaded from an external server via wireless communication.
Other application programs may be installed within the mobile terminal 100 at the time of manufacturing or shipping, which is typically the case for basic functions of the mobile terminal 100 (for example, receiving a call, placing a call, receiving a message, sending a message, and the like). It is common for application programs to be stored in the memory 170, installed in the mobile terminal 100, and executed by the controller 180 to perform an operation (or function) for the mobile terminal 100.
[0010] The controller 180 typically functions to control overall operation of the mobile terminal 100, in addition to the operations associated with the application programs. The controller 180 can provide or process information or functions appropriate for a user by processing signals, data, information and the like which are input or output by the various components depicted in Fig. 1A, or by activating application programs stored in the memory 170. For example, the controller 180 controls some or all of the components illustrated in Figs. 1A to 1C according to the execution of an application program that has been stored in the memory 170. The power supply unit 190 can be configured to receive external power or provide internal power in order to supply the appropriate power required for operating elements and components included in the mobile terminal 100. The power supply unit 190 may include a battery, and the battery may be configured to be embedded in the terminal body, or configured to be detachable from the terminal body.
[0011] Still referring to Fig. 1A, the various components depicted in this figure will now be described in more detail. Regarding the wireless communication unit 110, the broadcast receiving module 111 is typically configured to receive a broadcast signal and/or broadcast-associated information from an external broadcast management entity via a broadcast channel. The broadcast channel may include a satellite channel, a terrestrial channel, or both. In some embodiments, two or more broadcast receiving modules 111 may be utilized to facilitate simultaneous reception of two or more broadcast channels, or to support switching among broadcast channels.
[0012] The mobile communication module 112 can transmit wireless signals to and/or receive wireless signals from one or more network entities. Typical examples of a network entity include a base station, an external mobile terminal, a server, and the like. Such network entities form part of a mobile communication network, which is constructed according to technical standards or communication methods for mobile communications (for example: Global System for Mobile communications (GSM), Code Division Multiple Access (CDMA), CDMA2000, Enhanced Voice-Data Optimized or Enhanced Voice-Data Only (EV-DO), Wideband CDMA (WCDMA), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), LTE-Advanced (LTE-A) and similar methods). Examples of wireless signals transmitted and/or received via the mobile communication module 112 include audio call signals, video (telephony) call signals, or various formats of data to support the communication of text and multimedia messages. The wireless Internet module 113 is configured to facilitate wireless Internet access. This module may be internally or externally coupled to the mobile terminal 100. The wireless Internet module 113 may transmit and/or receive wireless signals via communication networks according to wireless Internet technologies. Examples of such wireless Internet access include Wireless LAN (WLAN), Wi-Fi, Wi-Fi Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), LTE-Advanced (LTE-A) and similar technologies. The wireless Internet module 113 may transmit/receive data according to one or more of such wireless Internet technologies, and other Internet technologies as well.
In some embodiments, when wireless Internet access is implemented according to, for example, WiBro, HSDPA, HSUPA, GSM, CDMA, WCDMA, LTE, LTE-A, etc., as part of a mobile communication network, the wireless Internet module 113 performs such wireless Internet access. As such, the Internet module 113 may cooperate with, or function as, the mobile communication module 112. The short-range communication module 114 is configured to facilitate short-range communications. Suitable technologies for implementing such short-range communications include BLUETOOTHTM, Radio Frequency IDentification (RFID), Infrared Data Association (IrDA), Ultra-WideBand (UWB), ZigBee, Near Field Communication (NFC), Wi-Fi, Wi-Fi Direct, Wireless USB, and the like. The short-range communication module 114 in general supports wireless communications between the mobile terminal 100 and a wireless communication system, communications between the mobile terminal 100 and another mobile terminal 100, or communications between the mobile terminal and a network where another mobile terminal 100 (or an external server) is located, via wireless networks. One example of the wireless networks is a wireless personal area network. In some embodiments, another mobile terminal (which may be configured similarly to the mobile terminal 100) may be a portable device, for example, a smart watch, smart glasses or a head-mounted display (HMD), which is able to exchange data with the mobile terminal 100 (or otherwise cooperate with the mobile terminal 100). The short-range communication module 114 may detect or recognize the portable device, and permit communication between the portable device and the mobile terminal 100.
In addition, when the detected portable device is a device which is authenticated to communicate with the mobile terminal 100, the controller 180 may, for example, cause transmission of data processed in the mobile terminal 100 to the portable device via the short-range communication module 114. Hence, a user of the portable device may use, on the portable device, the data processed in the mobile terminal 100. For example, when a call is received in the mobile terminal 100, the user may answer the call using the portable device. Also, when a message is received in the mobile terminal 100, the user can check the received message using the portable device. The location information module 115 is generally configured to detect, calculate, derive or otherwise identify a position of the mobile terminal. As an example, the location information module 115 includes a Global Positioning System (GPS) module, a Wi-Fi module, or both. If desired, the location information module 115 may alternatively or additionally function with any of the other modules of the wireless communication unit 110 to obtain data related to the position of the mobile terminal.
[0013] As one example, when the mobile terminal uses a GPS module, a position of the mobile terminal may be acquired using a signal sent from a GPS satellite. As another example, when the mobile terminal uses the Wi-Fi module, a position of the mobile terminal can be acquired based on information about a wireless access point (AP) which transmits a wireless signal to, or receives one from, the Wi-Fi module. The input unit 120 may be configured to permit various types of input to the mobile terminal 100. Examples of such input include audio, image and video input, data input, and user input. Image and video input is often obtained using one or more cameras 121. Such cameras 121 may process image frames of still pictures or video obtained by image sensors in a video or image capture mode. The processed image frames can be displayed on the display unit 151 or stored in the memory 170. In some cases, the cameras 121 may be arranged in a matrix configuration to permit the mobile terminal 100 to input a plurality of images having various angles or focal points. As another example, the cameras 121 may be located in a stereoscopic arrangement to acquire left and right images for implementing a stereoscopic image.
[0014] The microphone 122 is generally implemented to permit audio input to the mobile terminal 100. The audio input can be processed in various manners according to a function being executed in the mobile terminal 100. If desired, the microphone 122 may include noise removing algorithms to remove unwanted noise generated in the course of receiving external audio signals. The user input unit 123 is a component that permits input by a user. Such user input may enable the controller 180 to control operation of the mobile terminal 100. The user input unit 123 may include one or more of mechanical input elements (for example, a key, a button located on a front and/or rear surface or a side surface of the mobile terminal 100, a dome switch, a jog wheel, a jog switch, and the like), or a touch-sensitive input, among others. As one example, the touch-sensitive input may be a virtual key or a soft key, which is displayed on a touch screen through software processing, or a touch key which is located on the mobile terminal at a location that is other than the touch screen. On the other hand, the virtual key or the visual key may be displayed on the touch screen in various shapes, for example, graphic, text, icon, video, or a combination thereof. The detection unit 140 is generally configured to detect one or more of internal information of the mobile terminal, surrounding environment information of the mobile terminal, user information, or similar information. The controller 180 generally cooperates with the detection unit 140 to control operation of the mobile terminal 100 or execute data processing, a function or an operation associated with an application program installed in the mobile terminal based on the detection provided by the detection unit 140. The detection unit 140 may be implemented using any of a variety of sensors, some of which will now be described in more detail.
The proximity sensor 141 may include a sensor to detect the presence or absence of an object approaching a surface, or an object located near a surface, by using an electromagnetic field, infrared rays, or similar means without mechanical contact. The proximity sensor 141 may be arranged at an inner region of the mobile terminal covered by the touch screen, or near the touch screen. The proximity sensor 141, for example, may include any of a transmissive type photoelectric sensor, a direct reflective type photoelectric sensor, a mirror reflective type photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitance type proximity sensor, a magnetic type proximity sensor, an infrared ray proximity sensor, and similar sensors. When the touch screen implemented is of the capacitance type, the proximity sensor 141 can detect proximity of a pointer relative to the touch screen by changes of an electromagnetic field, which is responsive to an approach of an object with conductivity. In this case, the touch screen (touch sensor) may also be categorized as a proximity sensor. The term "proximity touch" will often be referred to herein to denote the scenario in which a pointer is positioned to be proximate to the touch screen without contacting the touch screen. The term "contact touch" will often be referred to herein to denote the scenario in which a pointer makes physical contact with the touch screen. For the position corresponding to the proximity touch of the pointer relative to the touch screen, such position will correspond to a position where the pointer is perpendicular to the touch screen. The proximity sensor 141 may detect a proximity touch, and proximity touch patterns (for example, distance, direction, speed, time, position, moving status, and the like).
In general, the controller 180 processes data corresponding to proximity touches and proximity touch patterns detected by the proximity sensor 141, and causes output of visual information on the touch screen. In addition, the controller 180 can control the mobile terminal 100 to execute different operations or process different data according to whether a touch with respect to a point on the touch screen is either a proximity touch or a contact touch. A touch sensor can detect a touch applied to the touch screen, such as the display unit 151, using any of a variety of touch methods. Examples of such touch methods include a resistive type, a capacitive type, an infrared type, and a magnetic field type, among others. As one example, the touch sensor may be configured to convert changes of pressure applied to a specific part of the display unit 151, or convert capacitance occurring at a specific part of the display unit 151, into electric input signals. The touch sensor may also be configured to detect not only a touched position and a touched area, but also touch pressure and/or touch capacitance. A touch object is generally used to apply a touch input to the touch sensor. Examples of typical touch objects include a finger, a touch pen, a stylus pen, a pointer, or a similar object. When a touch input is detected by a touch sensor, corresponding signals may be transmitted to a touch controller. The touch controller may process the received signals, and then transmit corresponding data to the controller 180. Accordingly, the controller 180 may detect which region of the display unit 151 has been touched. Here, the touch controller may be a component separate from the controller 180, the controller 180 itself, or combinations thereof.
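The distinction drawn above between a "proximity touch" and a "contact touch" can be illustrated with a small sketch. The sensing-range value and the function name are assumptions introduced for illustration, not values from the patent.

```python
# Sketch of classifying a pointer event as a proximity touch or a contact touch,
# based on the pointer's distance from the touch screen. The range value is an
# illustrative assumption.

PROXIMITY_RANGE_MM = 10.0  # distance above the screen within which a proximity touch is sensed

def classify_touch(distance_mm):
    """Return the touch category for a pointer at the given distance from the screen."""
    if distance_mm <= 0.0:
        return "contact touch"    # pointer makes physical contact with the touch screen
    if distance_mm <= PROXIMITY_RANGE_MM:
        return "proximity touch"  # pointer is proximate to the screen without contacting it
    return "none"                 # pointer is outside the sensing range
```

A controller following the text above would then choose different operations depending on which category `classify_touch` returns for a given point on the screen.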
[0015] In some embodiments, the controller 180 may execute the same or different commands depending on the type of touch object that touches the touch screen or a touch pad provided in addition to the touch screen. Whether the same command or a different command is executed depending on the object that provides a touch input can be decided, for example, on the basis of a current operating state of the mobile terminal 100 or of an application program currently running. The touch sensor and the proximity sensor can be implemented, individually or in combination, to detect various types of touch. Such touches include a short touch (or tap), a long touch, a multi-touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, a hovering touch, etc. If desired, an ultrasonic sensor may be implemented to recognize position information relating to a touch object using ultrasonic waves. The controller 180, for example, can calculate a position of a wave generating source based on the information detected by a light sensor and a plurality of ultrasonic sensors. Since light is much faster than ultrasonic waves, the time it takes light to reach the optical sensor is much shorter than the time it takes an ultrasonic wave to reach an ultrasonic sensor. The position of the wave generating source can be calculated using this fact. For example, the position of the wave generating source can be calculated from the difference between the arrival time of the ultrasonic wave and the arrival time of the light, the light serving as a reference signal. The camera 121 typically includes at least one camera sensor (CCD, CMOS, etc.), a photographic sensor (or image sensors) and a laser sensor. Implementing the camera 121 with a laser sensor can enable detection of the contact of a physical object with respect to a 3D stereoscopic image.
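The arrival-time calculation described above can be illustrated with a short sketch. This is not code from the patent: the speed-of-sound constant and the function name are assumptions, and the light arrival is simply treated as the time-zero reference, as the passage suggests.

```python
# Illustrative sketch (not from the patent): estimating the distance to a
# wave-generating source from the arrival-time difference between light and
# ultrasound. Light arrives almost instantly, so the moment the optical
# sensor fires serves as the emission reference, and the ultrasonic delay
# alone yields the distance.

SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at ~20 degrees C (assumption)

def distance_from_delay(t_light_s: float, t_ultrasound_s: float) -> float:
    """Distance (m) to the source, treating the light arrival as time zero."""
    delay = t_ultrasound_s - t_light_s
    if delay < 0:
        raise ValueError("ultrasound cannot arrive before the light reference")
    return SPEED_OF_SOUND_M_S * delay

# Example: ultrasound arrives 2 ms after the light pulse -> roughly 0.69 m away.
d = distance_from_delay(0.0, 0.002)
```

With several ultrasonic sensors, as the text mentions, the per-sensor distances obtained this way could then be combined by triangulation to yield a full position.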
The photographic sensor may be laminated onto the display or may partially cover the display. The photographic sensor may be configured to analyze the motion of a physical object near the touch screen. In more detail, the photographic sensor may include rows and columns of photodiodes and phototransistors for analyzing the content received at the photographic sensor using an electrical signal that varies with the amount of light applied. More precisely, the photographic sensor can calculate the coordinates of the physical object according to the variation of light, so as to obtain position information of the physical object.
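A hypothetical sketch of the coordinate calculation this passage describes: the rows and columns of photodiode readings are scanned, and the cell with the largest change in received light between two frames is taken as the object's position. The names and the simple "maximum variation" rule are assumptions for illustration, not the patent's method.

```python
# Assumed sketch: each frame is a 2D grid of photodiode light readings.
# The object's coordinates are taken as the cell whose reading changed
# the most between two consecutive frames.

def locate_object(prev_frame, curr_frame):
    """Return (row, col) of the largest per-cell light variation."""
    best, best_pos = -1.0, (0, 0)
    for r, (prev_row, curr_row) in enumerate(zip(prev_frame, curr_frame)):
        for c, (p, q) in enumerate(zip(prev_row, curr_row)):
            variation = abs(q - p)
            if variation > best:
                best, best_pos = variation, (r, c)
    return best_pos

prev = [[10, 10, 10], [10, 10, 10], [10, 10, 10]]
curr = [[10, 10, 10], [10, 3, 10], [10, 10, 10]]
# The object's shadow dims the centre cell, so it is located at (1, 1).
pos = locate_object(prev, curr)
```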
[0016] The display unit 151 is generally configured to output information processed in the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program running on the mobile terminal 100, or user interface (UI) and graphical user interface (GUI) information in response to the execution screen information. In some embodiments, the display unit 151 may be implemented as a stereoscopic display unit for displaying stereoscopic images. A typical stereoscopic display unit may employ a stereoscopic display system such as a stereoscopic system (a system with glasses), an auto-stereoscopic system (a system without glasses), a projection system (holographic system) or similar systems. The audio output module 152 is generally configured to output audio data. Such audio data may be obtained from any of a number of different sources, such that the audio data may be received from the wireless communication unit 110 or may have been stored in the memory 170. The audio data may be output during modes such as a signal receiving mode, a call mode, a recording mode, a voice recognition mode, a broadcast receiving mode and similar modes. The audio output module 152 may provide an audible output relating to a particular function (e.g., a call signal receiving tone, a message receiving tone, etc.) performed by the mobile terminal 100. The audio output module 152 may also be implemented as a receiver, a loudspeaker, a buzzer, or a similar device. A haptic module 153 may be configured to generate various tactile effects that a user feels, perceives or otherwise experiences. A typical example of a tactile effect generated by the haptic module 153 is vibration. The intensity, pattern and the like of the vibration generated by the haptic module 153 can be controlled by user selection or by a controller setting. For example, the haptic module 153 may output different vibrations in a combined manner or in a sequential manner.
[0017] In addition to vibration, the haptic module 153 can generate various other tactile effects, including a stimulating effect such as an arrangement of pins moving vertically to touch the skin, a spray force or suction force of air through a jet orifice or a suction opening, a touch to the skin, contact of an electrode, an electrostatic force, an effect reproducing the sensation of cold or warmth using an element that can absorb or generate heat, and the like. The haptic module 153 can also be implemented to allow the user to feel a tactile effect through a muscular sensation, such as in the fingers or the arm of the user, as well as to transfer the tactile effect through direct contact. Two or more haptic modules 153 may be provided depending on the particular configuration of the mobile terminal 100. An optical output module 154 may output a signal indicating the generation of an event using light from a light source. Examples of events generated in the mobile terminal 100 may include message reception, call signal reception, a missed call, an alarm, a schedule notification, e-mail reception, reception of information through an application, etc. A signal output by the optical output module 154 may be implemented in such a manner that the mobile terminal emits monochromatic light or light of a plurality of colors. The signal output may be stopped, for example, when the mobile terminal detects that a user has checked the generated event.
[0018] The interface unit 160 serves as an interface for external devices to be connected to the mobile terminal 100. For example, the interface unit 160 can receive data transmitted from an external device, receive power to transfer to elements and components in the mobile terminal 100, or transmit internal data of the mobile terminal 100 to such an external device. The interface unit 160 may include wired or wireless headset ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, or similar ports. The identification module may be a chip that stores various information for authenticating authority to use the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM) and similar modules. In addition, the device having the identification module (also referred to herein as an "identification device") may take the form of a smart card. Accordingly, the identification device can be connected to the terminal 100 via the interface unit 160. When the mobile terminal 100 is connected to an external cradle, the interface unit 160 can serve as a passage to allow power from the cradle to be supplied to the mobile terminal 100, or as a passage to allow various command signals input by the user from the cradle to be transferred to the mobile terminal through the interface unit. Various command signals or power input from the cradle may operate as signals for recognizing that the mobile terminal is properly mounted on the cradle. The memory 170 may store programs to support operations of the controller 180 and store input/output data (e.g., a phone book, messages, still images, videos, etc.).
The memory 170 can store data relating to various forms of vibration and audio sounds that are output in response to touch inputs on the touch screen. The memory 170 may comprise one or more types of storage media, including a flash memory, a hard disk, a solid state disk, a silicon disk drive, a multimedia card micro type, a card-type memory (for example, SD or XD memory, etc.), a random access memory (RAM), a static random access memory (SRAM), a read only memory (ROM), an electrically erasable programmable read only memory (EEPROM), a programmable read only memory (PROM), a magnetic memory, a magnetic disk, an optical disk and similar memory devices. The mobile terminal 100 may also be operated in connection with a network storage device that performs the storage function of the memory 170 over a network, such as the Internet. The controller 180 may typically control the general operations of the mobile terminal 100. For example, the controller 180 may set or release a lock state to restrict a user from entering a control command for applications when a state of the mobile terminal satisfies a preset condition. The controller 180 may also perform the control and processing associated with voice calls, data communications, video calls, and the like, or perform pattern recognition processing to recognize a handwriting input or a picture drawing input made on the touch screen as characters or images, respectively. In addition, the controller 180 may control one or a combination of these components in order to implement the various embodiments given herein by way of example. The power supply unit 190 receives external power or provides internal power and supplies the appropriate power required to operate the respective elements and components included in the mobile terminal 100. The power supply unit 190 may include a battery, which is typically rechargeable or detachably coupled to the terminal body for charging. The power supply unit 190 may include a connection port.
The connection port may be configured as one of the examples of the interface unit 160 to which an external charger is electrically connected to provide power for recharging the battery. In another example, the power supply unit 190 may be configured to recharge the battery in a wireless manner without using the connection port. In this example, the power supply unit 190 can receive the energy transferred from an external wireless energy transmitter by at least one inductive coupling method which is based on magnetic induction and / or a magnetic resonance coupling method which is based on electromagnetic resonance.
[0019] Various embodiments described herein may be implemented on a computer readable medium, on a machine readable medium or on a similar medium using, for example, software components, hardware components or any combination thereof. Referring now to FIGS. 1B and 1C, the mobile terminal 100 is described with reference to a bar-type terminal body. However, the mobile terminal 100 may alternatively be implemented in any of a large number of different configurations. Examples of such configurations include a watch type, a clip type, a glasses type, or a folder type, a flip type, a slide type, a swing type and a swivel type in which two or more bodies are combined with each other in a relatively movable manner, and combinations of these types. The description will often relate to a particular type of mobile terminal (for example, a bar type, a watch type, a glasses type, etc.). However, such teachings regarding a particular type of mobile terminal will generally also apply to other types of mobile terminals. The mobile terminal 100 will generally comprise a housing (for example, a frame, a case, a cover, etc.) forming the appearance of the terminal. In this embodiment, the housing is formed using a front housing 101 and a rear housing 102. Various electronic components are incorporated in a space formed between the front housing 101 and the rear housing 102. At least one middle housing may be further positioned between the front housing 101 and the rear housing 102. The display unit 151 is shown located on the front of the terminal body to output information. As shown in the figure, a window 151a of the display unit 151 may be mounted on the front housing 101 to form the front surface of the terminal body together with the front housing 101. In some embodiments, electronic components may also be mounted on the rear housing 102. Examples of such electronic components include a detachable battery 191, an identification module, a memory card, and the like.
A rear cover 103 is shown covering the electronic components, and this cover can be releasably coupled to the rear housing 102. Therefore, when the rear cover 103 is detached from the rear housing 102, the electronic components mounted on the rear housing 102 are externally exposed. As shown in the figure, when the rear cover 103 is coupled to the rear housing 102, a side surface of the rear housing 102 is partially exposed. In some cases, during coupling, the rear housing 102 may also be fully shielded by the rear cover 103. In some embodiments, the rear cover 103 may include an opening for externally exposing a camera 121b or an audio output module 152b. The housings 101, 102, 103 may be formed of injection-molded synthetic resin or may be formed of a metal, for example, stainless steel (STS), aluminum (Al), titanium (Ti) or a similar metal.
[0020] As an alternative to the example in which the plurality of housings form an internal space for housing components, the mobile terminal 100 may be configured such that a housing forms the internal space. In this example, a mobile terminal 100 having a single body is formed in such a manner that the synthetic resin or metal extends from a side surface to a back surface.
[0021] If desired, the mobile terminal 100 may include a watertight unit (not shown) to prevent the ingress of water into the body of the terminal. For example, the watertight unit may include a watertight member located between the window 151a and the front housing 101, between the front housing 101 and the rear housing 102, or between the rear housing 102 and the rear cover 103, for hermetically sealing an internal space when these housings are assembled. Figures 1B and 1C show certain components as they are arranged on the mobile terminal. However, it should be understood that other arrangements are possible and remain within the teachings of this disclosure. Some components may be omitted or arranged differently. For example, the first handling unit 123a may be located on another surface of the terminal body and the second audio output module 152b may be located on the side surface of the terminal body.
[0022] The display unit 151 outputs information processed in the mobile terminal 100. The display unit 151 may be implemented using one or more suitable display devices. Examples of such suitable display devices include a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display, an electronic ink display and combinations of these displays. The display unit 151 may be implemented using two display devices, which may each implement an identical or different display technology. For example, a plurality of display units 151 may be arranged on one side only, either spaced apart from one another or integrated, or these devices may be arranged on different surfaces. The display unit 151 may also include a touch sensor that detects a touch input received on the display unit. When a touch is input to the display unit 151, the touch sensor can be configured to detect that touch, and the controller 180, for example, can generate a control command or other signal corresponding to the touch. The content that is input in a tactile manner can be a text or numerical value, or a menu item that can be indicated or designated in various ways.
[0023] The touch sensor can be configured as a film having a touch pattern, disposed between the window 151a and a display on a rear surface of the window 151a, or as a metal wire patterned directly on the rear surface of the window 151a. Alternatively, the touch sensor can be integrally formed with the display. For example, the touch sensor may be disposed on a substrate of the display or within the display. The display unit 151 may also form a touch screen together with the touch sensor. Here, the touch screen can serve as a user input unit 123 (see Figure 1A). Therefore, the touch screen can replace at least some of the functions of the first handling unit 123a.
[0024] The first audio output module 152a may be implemented as a speaker to output voice audio sounds, alarm sounds, multimedia audio reproduction, and the like.
[0025] The window 151a of the display unit 151 will typically include a port for passing audio sounds generated by the first audio output module 152a. One alternative is to allow the audio sounds to be released along an assembly gap between the structural bodies (for example, a gap between the window 151a and the front housing 101). In this case, a hole independently formed for outputting audio sounds may not be visible or may otherwise be hidden from view, further simplifying the appearance and fabrication of the mobile terminal 100. The optical output module 154 may be configured to output light indicating the generation of an event. Examples of such events include message reception, call signal reception, a missed call, an alarm, a schedule notification, e-mail reception, reception of information through an application, etc. When a user has checked a generated event, the controller can control the optical output unit 154 to stop the light output.
[0026] The first camera 121a may process image frames such as still or moving images obtained by the image sensor in a capture mode or in a video calling mode. The processed image frames may be displayed on the display unit 151 or stored in the memory 170. The first and second manipulation units 123a and 123b are examples of the user input unit 123 and may be manipulated by a user to provide an input to the mobile terminal 100. The first and second handling units 123a and 123b may also be referenced in common as a handling portion and may employ any tactile method that allows the user to perform manipulation such as contact, pressure, scrolling, etc. The first and second handling units 123a and 123b can also employ any non-tactile method that allows the user to perform manipulation such as proximity touch, flyby, etc. Figure 1B illustrates the first handling unit 123a as a touch key, but possible variants include a mechanical key, a push button, a touch key, and combinations of these keys. An input received on the first and second handling units 123a and 123b can be used in various ways. For example, the first manipulation unit 123a may be used by the user to provide menu entry, home key, cancel, search, etc., and the second handling unit 123b may be used by the user for providing an input for controlling a volume level outputted from the first or second audio output module 152a or 152b to switch to a touch recognition mode of the display unit 151, or a similar function . In another example of the user input unit 123, a rear input unit (not shown) may be located on the rear surface of the terminal body. The rear input unit may be manipulated by a user to provide an input to the mobile terminal 100. The input may be used in a large number of different ways. 
For example, the rear input unit may be used by the user to provide an input for power on/off, start, end, scroll, for controlling the volume level output from the first or second audio output module 152a or 152b, for switching to a touch recognition mode of the display unit 151, etc. The rear input unit may be configured to permit touch input, pressure input, or combinations of these inputs. The rear input unit may be located to overlap the display unit 151 of the front side in a direction of the thickness of the terminal body. By way of example, the rear input unit may be located on an upper end portion of the rear face of the terminal body so that a user can easily manipulate it with an index finger when the user grips the terminal body with one hand. Alternatively, the rear input unit can be positioned at almost any position on the rear side of the terminal body. Embodiments that include the rear input unit may implement some or all of the functionality of the first handling unit 123a in the rear input unit. Thus, in situations where the first handling unit 123a is omitted from the front side, the display unit 151 may have a larger screen. As a further alternative, the mobile terminal 100 may include a fingerprint sensor that scans a user's fingerprint. The controller 180 may then use the fingerprint information detected by the fingerprint sensor as part of an authentication procedure. The fingerprint sensor may also be installed in the display unit 151 or implemented in the user input unit 123.
[0027] The microphone 122 is shown located at one end of the mobile terminal 100, but other locations are possible. If desired, multiple microphones can be implemented, with such an arrangement permitting the reception of stereophonic sounds.
[0028] The interface unit 160 may serve as a path allowing the mobile terminal 100 to interface with external devices. For example, the interface unit 160 may include one or more connection terminals for connecting to another device (for example, an earphone, an external speaker, etc.), a port for near field communication (for example, an IrDA port, a Bluetooth port, a wireless LAN port, etc.), or a power supply terminal for supplying power to the mobile terminal 100. The interface unit 160 may be implemented in the form of a socket for accommodating an external card, such as a subscriber identification module (SIM), a user identity module (UIM) or a memory card for storing information.
[0029] The second camera 121b is shown located on the rear side of the terminal body, with an image capture direction that is substantially opposite to the image capture direction of the first camera unit 121a. If desired, the second camera 121b may alternatively be located at other locations, or made movable so as to have an image capture direction different from that shown. The second camera 121b may include a plurality of lenses arranged along at least one line. The plurality of lenses may also be arranged in a matrix configuration. Such a camera may be referred to as an "array camera". When the second camera 121b is implemented as an array camera, images can be captured in various ways using the plurality of lenses, and images of better quality are obtained. As can be seen in FIG. 1C, a flash 124 is shown adjacent to the second camera 121b. When an image of a subject is captured with the camera 121b, the flash 124 may illuminate the subject.
[0030] As can be seen in FIG. 1B, the second audio output module 152b may be located on the body of the terminal. The second audio output module 152b may implement stereophonic sound functions in conjunction with the first audio output module 152a, and may also be used for implementing a speakerphone mode for telephone communications. At least one antenna for wireless communication may be located on the body of the terminal. The antenna may be installed in the body of the terminal or formed by the housing. For example, an antenna that configures a portion of the broadcast receiving module 111 may be retractable into the body of the terminal. Alternatively, an antenna may be formed with a film attached to an inner surface of the rear cover 103, or with a housing that includes a conductive material. A power supply unit 190 for supplying power to the mobile terminal 100 may include a battery 191, which is mounted in the terminal body or detachably coupled to an outer portion of the terminal body. The battery 191 can receive power via a power source cable connected to the interface unit 160. In addition, the battery 191 can be recharged wirelessly using a wireless charger. Wireless charging can be implemented by magnetic induction or electromagnetic resonance. The rear cover 103 is shown coupled to the rear housing 102 to shield the battery 191, to prevent detachment of the battery 191, and to protect the battery 191 against external impacts or foreign material. When the battery 191 is detachable from the body of the terminal, the rear cover 103 can be releasably coupled to the rear housing 102. An accessory for protecting an aspect of the mobile terminal 100, or for assisting or extending its functions, may also be provided on the mobile terminal 100. An example of such an accessory is a cover or a case for covering or accommodating at least one surface of the mobile terminal 100.
The cover or case may cooperate with the display unit 151 to extend the functions of the mobile terminal 100. Another example of an accessory is a touch pen to assist or extend touch input to a touch screen. Other preferred embodiments will be described in more detail with reference to additional figures. It should be understood by those skilled in the art that the present features can be embodied in several forms without departing from the characteristics of the present invention. If a large amount of image data is saved in a mobile terminal, it may not be easy for a user to quickly find desired image data. The reason for this is that it is not easy to search for image data by means of a search word. Thus, as a general method of controlling a mobile terminal to read photographs, there is a method of outputting a thumbnail list of a plurality of image data stored in a mobile terminal. If a user looks at each thumbnail included in the thumbnail list, the user can search for a desired image by obtaining an approximate view of each image. However, although the thumbnail list may be better than a single-image playback method, limitations remain on reading image data through the thumbnail list. According to one embodiment of the present invention, there is provided a method of using a thumbnail list that is more advanced than a control method that simply outputs a thumbnail list.
[0031] Control methods according to the present invention will be described hereinafter in detail with reference to the accompanying drawings. Fig. 2 is a flowchart for a method of automatically adjusting a scale of a thumbnail list according to an embodiment of the present invention. And, Figs. 3A-3D are state diagrams for an automatic scale adjustment control method of a thumbnail list according to an embodiment of the present invention. The present invention is described in detail hereinafter with reference to FIG. 2 and FIGS. 3A to 3D. First, a thumbnail list for image data means that each item in a list of image data is displayed as a small preview screen (hereinafter referred to as a thumbnail), as shown in Figs. 3A to 3D, and may have a grid structure. In the thumbnail list, the number of images that can be present on a single line (for example, one row in a thumbnail list of a grid structure) may vary depending on the size of a displayed thumbnail. For example, if 4 thumbnails, each of which is displayed with 100 pixels, are displayed on a single line, then 8 thumbnails, each of which is displayed with 50 pixels, can be displayed on a single line. Moreover, since the number of images present in a single column can also vary according to the size of a displayed thumbnail, the number of images that can be displayed on a single screen varies as well. In the following detailed description and claims, a measure of how many thumbnails can be displayed on a single screen will be called a scale. In particular, if the number of thumbnails displayed on a single screen is large, the scale is said to be large. On the other hand, if the number of thumbnails displayed on a single screen is small, the scale is said to be small. In a step S201, the controller 180 stores a plurality of image data in the memory 170. In a step S202, the controller 180 outputs a thumbnail list for the plurality of saved image data at a first scale. With reference to FIGS.
3A to 3D, the thumbnail list is output at the 1st scale [FIG. 3A, FIG. 3B, FIG. 3C], while the thumbnail list is output at a 2nd scale [FIG. 3D]. The thumbnail list shown in Figure 3A includes a plurality of elementary thumbnails 10-1 to 10-12.
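As a rough illustration of the scale defined above, the number of thumbnails visible at once follows directly from the screen size and the per-thumbnail pixel size. The concrete numbers mirror the 100-pixel and 50-pixel example in the text; the 400 x 600 screen dimensions are an assumption for illustration.

```python
# Minimal sketch of the "scale" notion: how many square thumbnails of a
# given pixel size fit on one screen. Halving the thumbnail size doubles
# the count per row and per column, quadrupling the scale.

def thumbnails_per_screen(screen_w: int, screen_h: int, thumb_px: int) -> int:
    """Thumbnails visible at once for square thumbnails of thumb_px pixels."""
    per_row = screen_w // thumb_px
    per_col = screen_h // thumb_px
    return per_row * per_col

small_scale = thumbnails_per_screen(400, 600, 100)  # 4 per row * 6 rows = 24
large_scale = thumbnails_per_screen(400, 600, 50)   # 8 per row * 12 rows = 96
```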
[0032] According to one embodiment of the present invention, in the case where a scroll input is received several times from a user, it is proposed to automatically change the scale of a thumbnail list. The reason for this is that when many scrolls have to be made to find a desired image, it is easier to find the desired image if the scale becomes larger. In particular, the number of thumbnails displayed on a single screen increases as the scale becomes larger. Thus, image data can be found easily by changing the scale. A control method according to an embodiment of the present invention therefore proposes that the scale be changed by nothing more than a user's scroll input. Such a condition for changing a scale will be called a scaling condition. Furthermore, according to the above description, a scaling condition comprises a case in which a scroll input is applied several times, although the present invention is not limited to this case. For example, the scaling condition may include a case where a scroll speed is equal to or greater than a prescribed speed. In a step S203, the controller 180 waits for receipt of a scroll input for the output thumbnail list. If the scroll input is not received, the controller 180 may return to step S202. If the scroll input is received, the controller 180 may proceed to a step S204. In step S204, the controller 180 may scroll the displayed thumbnail list. Referring to FIGS. 3A-3C, if a scroll input is received multiple times from a user, it can be seen that the thumbnails displayed through the thumbnail list change from the elementary thumbnails 10-1 to 10-12 to the elementary thumbnails 10-7 to 10-21 in response to the received scroll inputs. In a step S205, the controller 180 determines whether the scroll input received several times satisfies the scaling condition. As mentioned in the above description, the scaling condition may include a condition on a scroll input reception count and/or a condition on a scroll speed.
For example, the scaling condition may include a condition that a scroll input is applied at least 3 times, a condition that a scroll speed is equal to or greater than a predetermined speed, or a combination of these conditions. With reference to FIG. 3C and FIG. 3D, when a scroll input is received, if the scaling condition is satisfied, the controller 180 can display the thumbnail list in such a manner that the scale of the displayed thumbnail list is changed from the 1st scale to a 2nd scale. At the 2nd scale, 20 thumbnails are displayed on a single screen of the thumbnail list. When other scroll inputs continue to be received at the 2nd scale, if the scaling condition is satisfied, the 2nd scale may be changed sequentially to a 3rd scale, a 4th scale, and so on. When the scale is changed sequentially, as mentioned in the description above, if the scale of the thumbnail list continues to increase, the size of each elementary thumbnail inevitably continues to decrease. Thus, if the scale of the thumbnail list increases beyond a prescribed scale, the size of each thumbnail becomes so small that a user is unable to distinguish an image through the corresponding thumbnail. In order to compensate for this problem, according to one embodiment of the present invention, it is proposed to automatically group a plurality of images together from a prescribed scale or higher. Such an embodiment is described in detail hereinafter with reference to FIGS. 4A to 4C. Figs. 4A, 4B and 4C are diagrams for an example of automatically changing the display type of a thumbnail list into a grouping type list according to an embodiment of the present invention. As mentioned in the previous description, if the scale of a thumbnail list becomes equal to or greater than a prescribed scale, this has the disadvantage that a user has difficulty in checking a thumbnail.
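The scaling condition and the sequential, capped scale change described above can be sketched as follows. The threshold values (3 scrolls, a scroll speed in pixels per second) are assumptions for illustration; the patent only states that a count condition and/or a speed condition may be used.

```python
# Hedged sketch of the scaling condition: the scale advances (1st -> 2nd ->
# 3rd ...) when scrolling is repeated at least a threshold number of times
# or reaches a prescribed speed, and never advances past the threshold
# scale at which grouping would take over. Thresholds are assumed values.

MIN_SCROLL_COUNT = 3        # "at least 3 times a scroll input"
MIN_SCROLL_SPEED = 1500.0   # px/s, assumed prescribed speed

def scaling_condition_met(scroll_count: int, scroll_speed: float) -> bool:
    return scroll_count >= MIN_SCROLL_COUNT or scroll_speed >= MIN_SCROLL_SPEED

def next_scale(current_scale: int, scroll_count: int, scroll_speed: float,
               threshold_scale: int) -> int:
    """Advance the scale by one step, but never past the threshold scale."""
    if scaling_condition_met(scroll_count, scroll_speed):
        return min(current_scale + 1, threshold_scale)
    return current_scale

# Three quick scrolls at the 1st scale move the list to the 2nd scale.
scale = next_scale(1, scroll_count=3, scroll_speed=0.0, threshold_scale=4)
```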
Thus, if the scaling condition continues to be satisfied during a display at a prescribed threshold scale, the controller 180 stops scaling and is able to change the type of the thumbnail list to a grouping-type list. In this case, the grouping-type list means a type in which a plurality of image data are assigned to a single group and represented by a list of representative thumbnails or folders for the respective groups. Referring to Figure 4A, a thumbnail list represented at a threshold scale is displayed. In doing so, if a grouping change condition is satisfied, the controller 180 can output a grouping-type list, as can be seen in FIG. 4C, instead of outputting the thumbnail-type list. In the grouping-type list, grouped image data can be displayed as six folders 40-1 to 40-6. The grouping change condition means a condition for changing a thumbnail-type list to a grouping-type list. As a detailed example, while a thumbnail list is displayed at a prescribed threshold scale, the grouping change condition may include a case in which the scaling condition is satisfied. When the grouping change condition is satisfied, if the thumbnail list is changed to the grouping-type list, the controller 180 may represent a case in which a plurality of elementary thumbnails of the thumbnail-type list are grouped together, as an animation effect [cf. Figure 4B]. In addition, the grouping of the image data can be performed according to various conditions. For example, in the case where a photograph is taken several times in an activated state of the camera, data of the images taken can be assigned to a single group. In addition, the grouping can be done with reference to a date or geographical location of the taking of photographs (using a location determination of a mobile terminal with GPS). According to one embodiment of the present invention, the reference for grouping is not limited.
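One of the grouping references named above, the shooting date, can be sketched as follows; the (name, date) pair layout of the saved image data is an assumption for illustration.

```python
from collections import defaultdict
from datetime import date

def group_by_shooting_date(images):
    """Assign image data to groups keyed by shooting date.
    `images` is a list of (name, shooting_date) pairs; this layout
    is assumed for illustration only."""
    groups = defaultdict(list)
    for name, shot_on in images:
        groups[shot_on].append(name)
    return dict(groups)
```

The same skeleton applies to grouping by geographical location: the key would simply become a location bucket instead of a date.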
Furthermore, according to one embodiment of the present invention, an additional search assistance method is provided in addition to the scaling control method and the method of changing a display into a grouping-type list. Detailed examples describe a method of searching for an image using a face recognition result from automatic face recognition, and a search method using a voice instruction received from a user. A detailed embodiment of such a method is described hereinafter with reference to FIGS. 5A and 5B. Figs. 5A and 5B are diagrams for a control method of automatic face recognition followed by assistance in searching image data for the recognized face, according to an embodiment of the present invention. According to the example shown in FIG. 5A, the mobile terminal 100 currently outputs image data in the form of a grouping-type list.
[0033] The controller 180 activates a camera in advance, and it is then proposed to recognize a face in the activated state. The reason for this is described below. First, the controller 180 recognizes that a user's face is approaching. Second, if the user's face is approaching, it is proposed to search/filter image data using the recognized face.
[0034] In particular, in the state shown in FIG. 5A, the controller 180 recognizes a user's face through the already activated camera. If the recognized face comes closer than a prescribed distance (or, if a recognized face size becomes larger than a prescribed size), the controller 180 can perform a search/filter on the saved image data using the recognized face. Next, with reference to FIG. 5B, the controller 180 is able to output the grouping-type list using the searched/filtered image data.
[0035] According to the embodiment described above, the camera is activated in advance. Furthermore, according to one embodiment of the present invention, a suitable time to activate the camera is further proposed. In general, if a camera is activated, energy to analyze image data received through a camera module may be required as well as energy to operate the corresponding module. If these energies are consumed in a useless case, this causes an unnecessary waste of energy. Thus, according to one embodiment of the present invention, with respect to a suitable time to activate a camera, it is proposed to activate the camera if the scale is changed so as to be equal to or greater than a prescribed scale. The reason for this is that, if the scale is changed to be equal to or greater than a prescribed scale, a search method other than browsing through the thumbnail list may be required. Thus, according to the scaling condition, in the case of a display at a scale equal to or greater than a prescribed scale, the controller 180 can recognize a user's face by activating the camera. In the following description, an assistance method for searching image data using voice recognition is provided and described with reference to Figs. 6A and 6B.
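The camera activation timing above can be sketched as a small state holder; the threshold scale level is an assumed value for illustration.

```python
PRESCRIBED_SCALE = 3  # assumed threshold scale level

class FaceSearchCamera:
    """Activates the camera only from the prescribed scale upward,
    so no energy is spent while thumbnail browsing still suffices."""

    def __init__(self):
        self.active = False

    def on_scale_changed(self, scale: int) -> None:
        # Below the prescribed scale the camera stays (or goes) off.
        self.active = scale >= PRESCRIBED_SCALE
```

This keeps the energy cost of the camera module tied to the only situation in which face-based filtering is useful.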
[0036] Figs. 6A and 6B are diagrams for an assistance control method for searching image data using voice recognition according to an embodiment of the present invention. According to the example shown in FIG. 6A, the mobile terminal 100 currently outputs image data in the form of a grouping-type list.
[0037] The controller 180 activates voice recognition in advance, and it is then proposed to search image data using the recognized voice. In particular, in the state shown in FIG. 6A, the controller 180 recognizes a user's voice via the already activated microphone and is then able to perform a search/filter on the saved image data using the recognized voice. Then, with reference to FIG. 6B, the controller 180 is able to output the grouping-type list using the searched/filtered image data.
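The voice-based search/filter can be sketched as a keyword match over the keyword information of the saved image data; the (name, keyword-set) layout is an assumption for illustration.

```python
def filter_by_keyword(images, search_word):
    """Return image data whose keyword information contains the
    recognized search word (e.g. a location keyword such as 'sea').
    `images` is an assumed list of (name, keyword_set) pairs."""
    return [name for name, keywords in images if search_word in keywords]
```

The same filter serves the text-selection search described later: only the source of the search word (voice, selected text, OCR) differs.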
[0038] Furthermore, according to one embodiment of the present invention, there is provided a detailed search method based on the recognized voice. First, if a prescribed search word is entered through voice recognition, the controller 180 may search keyword information of the image data. In particular, if the prescribed search word includes a search word indicating a specific place or location, the controller 180 may search location information keywords of the image data. More particularly, if a "sea" search word is recognized, the controller 180 searches the location information keywords of the image data for image data photographed near the sea and is then able to provide the found image data. Furthermore, according to an embodiment of the present invention, while a thumbnail list is output, it is proposed to output a detailed image data in response to an input of a prescribed touch gesture. Such an embodiment is described in detail hereinafter with reference to Figs. 7A and 7B. Figs. 7A and 7B are diagrams for a control method for outputting detailed image data during the output of a thumbnail list according to an embodiment of the present invention. With reference to FIG. 7A, the mobile terminal 100 currently outputs a list of thumbnails at a prescribed scale via the touch screen 151. The larger the scale of the thumbnail list becomes, the smaller the size of each elementary thumbnail becomes. Thus, this has the disadvantage that a user is unable to clearly check detailed image data. Therefore, according to one embodiment of the present invention, there is provided a method of controlling the checking of detailed image data in response to an input of a user's prescribed touch gesture.
While the thumbnail list is output, if the controller 180 receives an input of a touch 10a at a prescribed point and then an application of a drag 10b in a direction from left to right while maintaining the touch 10a [Figure 7A], the controller 180 may output a screen for reading image data in detail instead of outputting the thumbnail list [Fig. 7B]. In particular, a plurality of the displayed image data shown in Fig. 7B may include a plurality of image data associated with the touch point 10a (e.g., a plurality of image data associated with an elementary thumbnail selected by the touch 10a, a plurality of image data having the same attribute as the selected elementary thumbnail, etc.). For example, if the touch 10a is applied to a prescribed thumbnail, the controller 180 can provide a read screen for the image data grouped with the prescribed thumbnail. In particular, the controller 180 can output a detailed display screen of a group corresponding to the prescribed thumbnail. Image data having the same attribute as the selected elementary thumbnail can mean image data having keyword information with the same shooting date, time and location as the selected thumbnail. According to the previous embodiments, the grouping-type list is described. Another example of the grouping-type list is described in detail below with reference to FIGS. 8A to 8C.
[0039] Figs. 8A, 8B and 8C are diagrams for an example of a grouping-type list according to one embodiment of the present invention. Referring to Figure 8A, each item in a grouping-type list may indicate a group thumbnail that represents each group. And, group thumbnails can have different sizes respectively. Also, each of the group thumbnails may include a preview of a representative image data among a plurality of image data included in the corresponding group, or a combination of previews of a plurality of image data. The representative image data may comprise an image data having the largest number of occurrences, the last image data, or the image data having the most keyword information, among images belonging to the corresponding group. A size of a group thumbnail can be determined by various references. For example, a group thumbnail of a group with more keyword information may have a larger size. In another example, a size of a group thumbnail may differ depending on the number of images in the corresponding group. In yet another example, a size of a group thumbnail may increase in proportion to the number of occurrences of an image included in the corresponding group.
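One of the sizing references above, a group thumbnail that grows with the number of images in the group, can be sketched as follows; the base and maximum edge lengths are illustrative assumptions.

```python
BASE_SIZE = 96   # assumed smallest edge length, in pixels
MAX_SIZE = 192   # assumed largest edge length, in pixels

def group_thumbnail_size(image_count: int, max_count: int) -> int:
    """Edge length of a group thumbnail, growing linearly with the
    number of images in the group relative to the largest group."""
    if max_count <= 0:
        return BASE_SIZE
    ratio = min(image_count, max_count) / max_count
    return round(BASE_SIZE + (MAX_SIZE - BASE_SIZE) * ratio)
```

Swapping `image_count` for a keyword count or an occurrence count yields the other two sizing references named in the paragraph.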
[0040] Each group thumbnail may include a shooting date and/or time information 81-1 on a plurality of image data included in the corresponding group. In the case where a first group thumbnail 80-1 is selected [FIG. 8B], it is possible to output a thumbnail list for a plurality of image data included in the selected first group thumbnail [FIG. 8C]. Thus, if a prescribed group thumbnail is selected, it is possible to read a plurality of image data belonging to the corresponding group. Meanwhile, changing a display type of image data in this way can cause discomfort to a user. Therefore, according to one embodiment of the present invention, there is provided a simple control method for reading image data included in a group thumbnail. Such an embodiment is described in detail hereinafter with reference to FIGS. 9A to 9C.
[0041] Figs. 9A, 9B and 9C are diagrams for a control method for easy reading of a plurality of image data included in a group thumbnail according to an embodiment of the present invention. Referring to Figure 9A, a grouping-type list is currently displayed. If an input of a touch 10d and a drag 10e is applied to a first group thumbnail 80-1, the controller 180 can sequentially display previews of a plurality of image data corresponding to the first group thumbnail 80-1 along a path of the drag 10e. With reference to FIG. 9B, if the touch 10d moves along the drag path 10e, a plurality of the image data 90-1 to 90-4 corresponding to the first group thumbnail 80-1 can be displayed sequentially on the path. If the drag 10e continues, with reference to FIG. 9C, the controller 180 can increase the number of image data displayed on the path. Figs. 10A and 10B are diagrams for a control method for a case of selecting a displayed image data according to an embodiment of the present invention. Referring to FIG. 10A, in response to a touch-drag input applied to a first group thumbnail, the mobile terminal 100 currently displays previews for a plurality of image data.
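The drag-driven preview behaviour above can be sketched by revealing one additional preview per fixed amount of drag distance; the spacing value is an assumption for illustration.

```python
PREVIEW_SPACING = 80  # assumed drag distance per extra preview, in pixels

def previews_for_drag(group_images, drag_distance):
    """Previews laid out along the drag path: the longer the drag,
    the more of the group's images are shown, up to the whole group."""
    count = min(len(group_images), 1 + int(drag_distance) // PREVIEW_SPACING)
    return group_images[:count]
```

A controller would call this on each move event of the drag and lay the returned previews out along the recorded path.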
[0042] If a prescribed image data 90-3 is selected from a plurality of the displayed image data, with reference to FIG. 10B, the controller 180 may display a detailed display screen 1001 of the selected image data 90-3.
[0043] Furthermore, according to the foregoing embodiment described with reference to Figs. 10A and 10B, when a user releases the touch of the touch-drag input, a plurality of the image data continues to be displayed, the present invention not being limited by this function. For example, when the touch is released, a plurality of the image data may stop being displayed.
[0044] Furthermore, according to one embodiment of the present invention, image data may be displayed on a map using a location information keyword of the image data. Such an embodiment is described in detail hereinafter with reference to FIGS. 11A to 11C. Figs. 11A, 11B and 11C are diagrams for a control method for reading image data on a map according to an embodiment of the present invention. Referring to Figs. 11A-11C, with respect to each image data, location information on a photographed location of the corresponding image data can be saved as a location information keyword. Thus, a prescribed indicator is output, for a location of a location information keyword, on a map output through the touch screen 151. If the prescribed indicator is selected, it is possible to output image data having the location information keyword of the corresponding location.
[0045] Referring to Fig. 11A, the mobile terminal 100 currently outputs a map through the touch screen 151. And, location indicators 11-1 to 11-6, each of which indicates that there is an image data item having a location information keyword of a corresponding point, are output on the map. When a photograph is taken at a location corresponding to each of the location indicators 11-1 to 11-6, if the mobile terminal 100 has moved along a prescribed path, it is possible to further output an indicator 1101 indicating that the mobile terminal 100 has moved.
[0046] After a prescribed location indicator 11-6 has been selected, if an input of a touch 10h and a drag 10j is received, the controller 180 may sequentially output a plurality of image data having a location information keyword of the selected location indicator 11-6 along a path of the drag 10j. In this case, a plurality of the image data can be sequentially output in accordance with a distance of the applied drag 10j. On the other hand, the automatic scale adjustment control method mentioned in the foregoing description can be applied to various kinds of lists as well as to a thumbnail list of image data. Such embodiments are described in detail hereinafter with reference to Figs. 12A-12C and Figs. 13A-13C. Figs. 12A, 12B and 12C are diagrams for a method of controlling a change of scale or display type in response to a scroll input on a reading screen of a news item according to an embodiment of the present invention. With reference to FIG. 12A, the mobile terminal 100 currently outputs a prescribed news item 1201 through the touch screen 151. If a scroll instruction for the news item 1201 is received, with reference to FIG. 12B, the controller 180 may scroll to move the news item 1201.
[0047] If the scroll instruction received several times satisfies a grouping condition, with reference to FIG. 12C, the controller 180 can output a list of news items grouped by theme. According to the example shown in FIG. 12C, the displayed list may comprise elements 1202-1 to 1202-3 corresponding to the news items.
[0048] The grouping condition may be set with reference to a scroll input count and/or a scroll speed, as for the above-mentioned scaling condition. Figs. 13A-13C are diagrams for a method of controlling a change of scale or display type in response to a scroll input on a mail list screen according to an embodiment of the present invention. With reference to FIG. 13A, the mobile terminal 100 currently outputs a list of emails via the touch screen 151. If a scroll instruction for the mail list is received, with reference to FIG. 13B, the controller 180 may scroll to move the list of emails. If the scroll instruction received several times satisfies a grouping condition, with reference to FIG. 13C, the controller 180 can output a list of emails grouped by theme. According to the example shown in FIG. 13C, the displayed list may comprise elements 14-1 to 14-3 respectively corresponding to the themes. On the other hand, if a grouping condition is satisfied, the email list can be changed to a grouping-type list. In this case, the controller 180 can output an animation effect during the change of the email list to the grouping-type list [cf. Figure 13B]. Furthermore, according to an embodiment of the present invention, there is provided a control method for entering a search word to more easily search for image data. Such an embodiment is described in detail hereinafter with reference to FIGS. 14A and 14B. Figs. 14A and 14B are diagrams for a control method for entering a search word to search for image data through a touch input according to an embodiment of the present invention. With reference to FIG. 14A, the controller 180 currently outputs prescribed text data via the touch screen 151. If a prescribed word selection input on the text data is received, the controller 180 can search for image data using the selected text data as a search word.
In this case, the method of searching image data using the search word may be identical to the previous method using voice recognition. For example, if a "City Hall" text is selected from the output text data, the controller 180 searches for/filters the image data having a location information keyword of "City Hall" and is then able to output a corresponding result as a grouping-type list [FIG. 14B]. Referring to FIG. 14B, three group thumbnails 1401-1 to 1401-3 for the image data with the location information keyword of "City Hall" are output.
[0049] According to the above-mentioned embodiment, a case of displaying text data is taken by way of example, the present invention not being limited by this case. For example, the present invention is applicable to a case of recognizing text present on displayed image data through optical character recognition. On the other hand, when an image data is read, if a recognized face is selected from the corresponding image data, the controller 180 searches for a contact corresponding to the selected face and is then able to provide the found contact to a user (for example, contact information is displayed on the touch screen). Then the user can make a phone call to the corresponding contact or send a text message to the corresponding contact. Accordingly, embodiments of the present invention provide various effects and/or various features.
[0050] According to at least one of the embodiments of the present invention, an image data search can be facilitated. According to at least one of the embodiments of the present invention, a scrolling operation can be performed adaptively while a screen output through a touch screen is scrolled.
[0051] Various embodiments may be implemented using a machine-readable medium on which instructions for execution by a processor are stored to perform various methods presented herein. Examples of possible machine-readable media include hard disk drives (HDD), solid state disks (SSD), silicon disk drives (SDD), ROMs, RAMs, CD-ROMs, magnetic tapes, floppy disks, optical data storage devices, the other types of storage media presented herein, and combinations thereof. If desired, the machine-readable medium may be embodied in the form of a carrier wave (e.g., a transmission over the Internet). The processor may comprise the controller 180 of the mobile terminal.
[0052] The aforementioned embodiments are given primarily by way of example and should not be construed as limiting the present invention. The present teachings can be readily applied to other types of methods and apparatus. This description is intended to be illustrative and not to limit the scope of the claims. Many alternatives, modifications and variations will be obvious to those skilled in the art. The features, structures, methods, and other characteristics of the exemplary embodiments described herein may be combined in various ways to provide additional and/or alternative exemplary embodiments. Since the present features may be embodied in several forms without departing from their characteristics, it should also be understood that the embodiments described above are not limited by any of the details of the foregoing description unless otherwise indicated, but rather should be considered broadly within the scope defined in the appended claims. Therefore, any changes and modifications that fall within the scope and bounds of the claims, or equivalents of such scope and bounds, are intended to be included in the appended claims.
Claims (15)
CLAIMS 1. A mobile terminal (100) comprising: a memory (170) configured to store image data; a touch screen (151); and a controller (180) configured to: cause the touch screen (151) to display a plurality of thumbnail images of a thumbnail image list based on the stored image data, the plurality of thumbnail images being displayed at a first scale; scroll the plurality of thumbnail images when a first input received on the touch screen (151) satisfies a first condition; and change the displayed scale of the plurality of thumbnail images from the first scale to a second scale when a second input received on the touch screen (151) satisfies a second condition.
2. The mobile terminal (100) of claim 1, wherein the controller (180) is further configured to: cause the touch screen (151) to stop displaying the plurality of thumbnail images and display a prescribed group of image data in response to a touch gesture received on the touch screen (151).
3. The mobile terminal (100) of claim 2, wherein the touch gesture comprises a touch input received on a displayed location of an image of the plurality of thumbnail images and a drag input extending from the touch input in a direction from right to left.
4. The mobile terminal (100) of claim 3, wherein the prescribed group of image data comprises a set of at least one image having the same attribute as that of the image.
5. The mobile terminal (100) of claim 1 or 2, wherein the controller (180) is further configured to: assign the stored image data to at least one group; and cause the touch screen (151) to stop displaying the plurality of thumbnail images and display a group list of the at least one group when an input received on the touch screen (151) comprises multiple inputs collectively satisfying a defined change condition.
6. The mobile terminal (100) of claim 5, wherein the controller (180) is further configured to: cause the touch screen (151) to display a preview of at least one image assigned to a group of the group list on a path of a drag input that is applied to the group of the group list.
7. The mobile terminal (100) of claim 5, wherein the defined change condition comprises additional satisfaction of the first condition during the display of the plurality of thumbnail images at the first scale.
8. The mobile terminal (100) of any one of claims 1, 2 and 5, wherein the first condition comprises a condition that the first input is a scroll input received a number of times equal to or greater than a prescribed number of times and/or a condition that the first input is a scroll input having a speed equal to or greater than a prescribed speed.
9. A method performed by a mobile terminal (100), the method comprising: displaying (S201), on a touch screen (151), a plurality of thumbnail images of a thumbnail image list based on stored image data, the plurality of thumbnail images being displayed at a first scale; scrolling (S203, S204) the plurality of thumbnail images when a first input received on the touch screen (151) satisfies a first condition; and changing (S205, S206) the displayed scale of the plurality of thumbnail images from the first scale to a second scale when a second input received on the touch screen (151) satisfies a second condition.
10. The method of claim 9, further comprising: stopping the display of the plurality of thumbnail images and displaying a prescribed group of image data in response to a touch gesture received on the touch screen (151).
11. The method of claim 10, wherein the touch gesture comprises a touch input received at a displayed location of an image of the plurality of thumbnail images and a drag input extending from the touch input in a direction from right to left.
12. The method of claim 11, wherein the prescribed group of image data comprises a set of at least one image having the same attribute as that of the image.
13. The method of claim 9 or 10, further comprising: assigning the stored image data to at least one group; and stopping the display of the plurality of thumbnail images and displaying a group list of the at least one group when an input received on the touch screen (151) comprises multiple inputs collectively satisfying a defined change condition.
14. The method of claim 13, further comprising: displaying, on the touch screen (151), a preview of at least one image assigned to a group of the group list on a path of a drag input that is applied to the group of the group list.
15. The method of claim 13, wherein the defined change condition comprises additional satisfaction of the first condition during the display of the plurality of thumbnail images at the first scale.